Course Registration System

Requirements Attributes Guidelines

 

Version 1.0

 

Revision History

Date         Version  Description      Author
08/Jan/1999  1.0      Initial Release  Simon Jones

Table of Contents

  1. Objectives
  2. Scope
  3. References
  4. Requirement Attributes
    1. Attributes for Product Requirements
      1. Status
      2. Benefit
      3. Effort
      4. Risk
      5. Target Release
      6. Assigned To
    2. Attributes for Use Case Requirements
      1. Status
      2. Priority
      3. Effort Estimate
      4. Technical Risk
      5. Target Development Iteration
      6. Assigned To
      7. Rose model
    3. Attributes for Test Cases
      1. Test Status
      2. Build Number
      3. Tested By
      4. Date Tested
      5. Test Notes
  5. Traceability Criteria
    1. Criteria for Product Requirements
    2. Criteria for Use Case Requirements
    3. Criteria for Test Requirements


  1. Objectives

    The Requirements Attributes Guidelines identify and describe the attributes that will be used to manage the requirements for the C-Registration System. In addition, this document outlines the requirements traceability that will be maintained on the project during development.

    The attributes assigned to each requirement will be used to manage the software development and to prioritize the features for each release.

    The objective of requirements traceability is to reduce the number of defects found late in the development cycle. Ensuring all product requirements are captured in the software requirements, design, and test cases improves the quality of the product.

  2. Scope

    The attribute and traceability guidelines in this document apply to the product requirements, software requirements, and test requirements for the C-Registration System.

  3. References

Applicable references are:

    1. Vision Document for the C-Registration System, WyIT387, V1.0, 1998, Wylie College IT.
    2. Supplementary Specification for the C-Registration System, WyIT400, Draft, 1998, Wylie College IT.
    3. Use Case Spec - Close Registration, WyIT403, Draft, 1998, Wylie College IT.
    4. Use Case Spec – Login, WyIT401, Draft, 1998, Wylie College IT.
    5. Use Case Spec - Maintain Professor Info, WyIT407, Draft, 1998, Wylie College IT.
    6. Use Case Spec - Register for Courses, WyIT402, Draft, 1998, Wylie College IT.
    7. Use Case Spec - Select Courses to Teach, WyIT405, Draft, 1998, Wylie College IT.
    8. Use Case Spec - Maintain Student Info, WyIT408, Draft, 1998, Wylie College IT.
    9. Use Case Spec - Submit Grades, WyIT409, Draft, 1998, Wylie College IT.
    10. Use Case Spec - View Report Card, WyIT410, Draft, 1998, Wylie College IT.
    11. Test Plan for the C-Registration System, WyIT<TBD>, Wylie College IT.
  4. Requirement Attributes

This section identifies the type of requirements that will be managed and the attributes that will be used for each requirement type. The C-Registration System will identify and manage the following requirement types:

    • Product Requirements,
    • Use Case Requirements, and
    • Test Cases.

This project is planning to use RequisitePro for managing the requirements. The attributes described in this section will be defined in RequisitePro. RequisitePro will enable each requirement tagged in the Word document to be described in terms of the attributes. RequisitePro will also be used to trace requirements as described in Section 5 of this document.
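As an illustration of how each tagged requirement carries its attributes, here is a minimal sketch in Python. The record type, field names, and default values are hypothetical, drawn only from the attribute definitions in the sections that follow; this is not the RequisitePro schema or API.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical in-memory record for one product requirement, carrying the
# attributes this document defines. Illustrative only; RequisitePro's own
# storage and API are not shown here.
@dataclass
class ProductRequirement:
    tag: str                                  # requirement tag in the Word document
    status: str = "Proposed"                  # Proposed | Approved | Incorporated
    benefit: str = "Useful"                   # Critical | Important | Useful
    effort_person_days: Optional[int] = None  # set by the development team
    risk: str = "Low"                         # High | Medium | Low
    target_release: Optional[str] = None      # e.g. "R 1.0"
    assigned_to: Optional[str] = None         # feature team or individual

req = ProductRequirement(tag="PR-001", status="Approved", benefit="Critical")
print(req.status)  # Approved
```

A record like this is what the traceability checks in Section 5 would operate over.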

    1. Attributes for Product Requirements

      The product requirements (defined in the Vision Document [1]) will be managed using the attributes defined in this section. These attributes are useful for managing the development effort and for prioritizing the features targeted for various releases.

      1. Status

        Set after the product requirements are defined in the Vision Document. Tracks progress during definition of the project baseline.

        Proposed Used to describe features that are under discussion but have not yet been reviewed and approved.
        Approved Capabilities that are deemed useful and feasible and have been approved for implementation.
        Incorporated Features incorporated into the product baseline at a specific point in time.


      2. Benefit

        Set by marketing, the product manager, or the business analyst. Not all requirements are created equal. Ranking requirements by their relative benefit to the end user opens a dialogue with customers, analysts, and members of the development team. Used in managing scope and determining development priority.

        The product manager will specify the benefit of each proposed feature in terms of critical, important, or useful.

        Critical Essential features. Failure to implement means the system will not meet customer needs. All critical features must be implemented in the release or the schedule will slip.
        Important Features important to the effectiveness and efficiency of the system for most applications. The functionality cannot be easily provided in some other way. Lack of inclusion of an important feature may affect customer or user satisfaction, or even revenue, but release will not be delayed due to lack of any important feature.
        Useful Features that are useful in less typical applications, will be used less frequently, or for which reasonably efficient workarounds can be achieved. No significant revenue or customer satisfaction impact can be expected if such an item is not included in a release.
      3. Effort

        Set by the development team. Because some features require more time and resources than others, estimating effort (for example, in person-weeks, lines of code, or function points) is the best way to gauge complexity and set expectations of what can and cannot be accomplished in a given time frame. Used in managing scope and determining development priority.

        For the C-Registration System, effort will be defined in person days of effort.

      4. Risk

        Set by the development team based on the probability that the project will experience undesirable events, such as cost overruns, schedule delays, or even cancellation. Most project managers find categorizing risks as high, medium, and low sufficient, although finer gradations are possible. Risk can often be assessed indirectly by measuring the uncertainty (range) of the project team's schedule estimate.

        On the C-Registration Project, the Project Manager will define risk in terms of high, medium, and low.

        High The impact of the risk combined with the probability of the risk occurring is high.
        Medium The impact of the risk is less severe and the probability of the risk occurring is less.
        Low The impact of the risk is minimal and the probability of the risk occurring is low.
      5. Target Release

        Records the intended product version in which the feature will first appear. This field can be used to allocate features from a Vision Document into a particular baseline release. When combined with the Status field, the project team can propose, record, and discuss various features of the release without committing them to development. Only features whose Status is set to Incorporated and whose Target Release is defined will be implemented. When scope management occurs, the Target Release version number can be increased so the item remains in the Vision Document but is scheduled for a later release.

        For the C-Registration System, the features are being planned out for the first 3 releases.

        R 1.0 Scheduled for C-Registration Release 1.0 (Oct 1999)
        R 2.0 Scheduled for C-Registration Release 2.0 (Oct 2000)
        R 3.0 Scheduled for C-Registration Release 3.0 (Oct 2001)
        Other Scheduled for future releases TBD
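The implementation rule above (only features whose Status is Incorporated and whose Target Release is defined will be implemented) can be sketched as a simple filter. The feature tags and records below are hypothetical:

```python
# Hypothetical feature records: (tag, status, target_release).
features = [
    ("FEAT-01", "Incorporated", "R 1.0"),
    ("FEAT-02", "Approved",     "R 1.0"),
    ("FEAT-03", "Incorporated", None),      # Target Release not yet defined
    ("FEAT-04", "Incorporated", "R 2.0"),
]

# Only Incorporated features with a defined Target Release are scheduled
# for implementation; everything else stays in the Vision Document.
to_implement = [tag for tag, status, release in features
                if status == "Incorporated" and release is not None]
print(to_implement)  # ['FEAT-01', 'FEAT-04']
```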
      6. Assigned To

      In many projects, features will be assigned to "feature teams" responsible for further elicitation, writing the software requirements, and implementation. This simple pull-down list will help everyone on the project team better understand responsibilities.

    2. Attributes for Use Case Requirements

      The use case requirements (defined in the C-Registration Use Case Specifications [3]-[10] and the Supplementary Specification [2]) will be managed using the attributes defined in this section. These attributes are useful for managing the development effort, determining iteration content, and for associating use cases with their specific Rose models.

      1. Status

        Set after the analyst has drafted the use cases. Tracks the progress of each use case from initial drafting through final validation.

        Proposed Use cases that have been identified but not yet reviewed and approved.
        Approved Use cases approved for further design and implementation.
        Validated Use cases that have been validated in a system test.
      2. Priority

        Set by the Project Manager. Determines the priority of the use case in terms of the importance of assigning development resources to it and monitoring its progress. Priority is typically based on the perceived benefit to the user, the planned release, the planned iteration, the complexity of the use case (risk), and the effort to implement it.

        High Use case is high priority. Its implementation is monitored closely, and resources are assigned to it appropriately.
        Medium Use case is medium priority relative to other use cases.
        Low Use case is low priority. Implementation of this use case is less critical and may be delayed or rescheduled to subsequent iterations or releases.
      3. Effort Estimate

        Set by the development team. Because some use cases require more time and resources than others, estimating effort (for example, in person-weeks, lines of code, or function points) is the best way to gauge complexity and set expectations of what can and cannot be accomplished in a given time frame. Used in managing scope and determining development priority. The Project Manager uses these effort estimates to determine the project schedule and to plan the resourcing of tasks effectively.

        C-Registration Project estimates effort in Person Days (assume 7.5 hours in a workday).

      4. Technical Risk

        Set by the development team based on the probability that the use case will experience undesirable events, such as effort overruns, design flaws, a high number of defects, poor quality, or poor performance. Such events are often the result of poorly understood or defined requirements, insufficient knowledge, lack of resources, technical complexity, or new technology, tools, or equipment.

        C-Registration Project will categorize the technical risks of each use case as high, medium, or low.

        High The impact of the risk combined with the probability of the risk occurring is high.
        Medium The impact of the risk is less severe and the probability of the risk occurring is less.
        Low The impact of the risk is minimal and the probability of the risk occurring is low.
      5. Target Development Iteration

        Records the development iteration in which the use case will be implemented. It is anticipated that the development for each release will be performed over several development iterations during the Construction Phase of the project.

        The iteration number assigned to each use case is used by the Project Manager to plan the activities of the project team.

        The current plan is for the C-Reg Project to undergo 3-4 iterations during the Construction Phase. In each iteration the selected set of use cases will be coded and tested.

        Iteration E-1 Scheduled for Elaboration Phase, Iteration 1
        Iteration C-1 Scheduled for Construction Phase, Iteration 1
        Iteration C-2 Scheduled for Construction Phase, Iteration 2
        Iteration C-3 Scheduled for Construction Phase, Iteration 3
      6. Assigned To

        Use cases are assigned to either individuals or development teams for further analysis, design, and implementation. A simple pull-down list will help everyone on the project team better understand responsibilities.

      7. Rose model

      Identifies the Rose use case model associated with the use case requirement.

    3. Attributes for Test Cases

      The test cases (defined in the Test Plan for the C-Registration System [11]) will be planned and tracked using the attributes defined in this section.

      1. Test Status

        Set by the Test Lead. Tracks the status of each test case.

        Untested Test case has not been performed.
        Failed Test has been conducted and failed.
        Conditional Pass Test has been completed with problems. Test is assigned a status of Pass on the condition that certain actions are completed.
        Pass Test has completed successfully.
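As a sketch, the Test Lead could roll these status values up across a set of test cases to see progress at a glance. The test case records below are hypothetical:

```python
from collections import Counter

# Hypothetical test case records: (test case id, test status).
results = [
    ("TC-01", "Pass"),
    ("TC-02", "Failed"),
    ("TC-03", "Untested"),
    ("TC-04", "Conditional Pass"),
    ("TC-05", "Pass"),
]

# Count how many test cases are in each status.
summary = Counter(status for _, status in results)
print(summary["Pass"])    # 2
print(summary["Failed"])  # 1
```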

      2. Build Number

        Records the system build in which the specific test case will be verified.

        The C-Registration System will be implemented in several iterations of the Construction Phase. An iteration typically requires 1-3 builds.

        Build A Test Case scheduled for System Build A
        Build B Test Case scheduled for System Build B
        Build C Test Case scheduled for System Build C
        Build D Test Case scheduled for System Build D
        Build E Test Case scheduled for System Build E
        Build F Test Case scheduled for System Build F
        Build G Test Case scheduled for System Build G
        Build H Test Case scheduled for System Build H
        Build I Test Case scheduled for System Build I
        Build J Test Case scheduled for System Build J
        Build K Test Case scheduled for System Build K
        Build L Test Case scheduled for System Build L
      3. Tested By

        Individual assigned to perform and verify the test case. This simple pull-down list will help everyone on the project team better understand responsibilities.


      4. Date Tested

        Planned test date or actual test date.

      5. Test Notes

Any notes associated with planning or executing the test.

  5. Traceability Criteria

    1. Criteria for Product Requirements

      The product requirements defined in the Vision Document [1] will be traced to the corresponding use case or supplementary requirements in the Use Case Specifications [3]-[10] and the Supplementary Specification [2].

      Each product requirement traces to one or more use case requirements and supplementary requirements.

    2. Criteria for Use Case Requirements

      The use case requirements defined in the Use Case Specifications [3]-[10] and the Supplementary Specification [2] will be traced to the corresponding test cases specified in the Test Plan [11].

      Each use case requirement traces to one or more system test cases.

    3. Criteria for Test Requirements

      The test cases specified in the Test Plan [11] are traced back to the product requirements [1] and the use case requirements [2]-[10] that are being verified by the particular test case.

      A test case may trace back to one or more product and use case requirements. Where a test case verifies a derived requirement or the design, it may have no traceability back to the original product requirements or use case requirements.
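These traceability criteria can be checked mechanically. The sketch below flags requirements that violate the "traces to one or more" criteria; all identifiers and trace links are hypothetical examples, not actual project data:

```python
# Hypothetical trace links: product requirement -> use case requirements,
# and use case requirement -> test cases.
product_to_usecase = {
    "PR-01": ["UC-Login", "UC-RegisterForCourses"],
    "PR-02": [],   # violates the criterion: no use case trace
}
usecase_to_test = {
    "UC-Login": ["TC-01"],
    "UC-RegisterForCourses": [],   # violates the criterion: no test case trace
}

def untraced(links):
    """Return items that violate the 'traces to one or more' criterion."""
    return sorted(item for item, targets in links.items() if not targets)

print(untraced(product_to_usecase))  # ['PR-02']
print(untraced(usecase_to_test))     # ['UC-RegisterForCourses']
```

A report like this, run before each baseline, is one way the objective stated in Section 1 (catching gaps before late in the development cycle) could be met.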



Copyright © 1987 - 2000 Rational Software Corporation
